Latent Gaussian Models for Topic Modeling
Authors
Abstract
A new approach is proposed for topic modeling, in which the latent matrix factorization employs Gaussian priors, rather than the Dirichlet-class priors widely used in such models. The use of a latent-Gaussian model permits simple and efficient approximate Bayesian posterior inference, via the Laplace approximation. On multiple datasets, the proposed approach is demonstrated to yield results as accurate as state-of-the-art approaches based on Dirichlet constructions, at a small fraction of the computation. The framework is general enough to jointly model text and binary data, here demonstrated to produce accurate and fast results for joint analysis of voting rolls and the associated legislative text. Further, it is demonstrated how the technique may be scaled up to massive data, with encouraging performance relative to alternative methods.
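To make the inference step concrete, here is a minimal sketch (a toy construction under my own assumptions, not the authors' implementation) of Laplace-approximate inference for a single document's latent Gaussian topic weights: the latent vector eta has a standard Gaussian prior, its softmax gives the topic proportions, word counts are multinomial under a fixed topic-word matrix phi, and the posterior over eta is approximated by a Gaussian centred at the MAP estimate with covariance equal to the inverse Hessian of the negative log posterior. The names phi, eta, counts and all toy sizes are illustrative.

    # Minimal sketch (not the paper's exact construction): Laplace approximation to the
    # posterior over one document's latent Gaussian topic weights, with a softmax link
    # and a fixed, assumed-known topic-word matrix `phi`.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    K, V = 3, 20                                      # topics, vocabulary size (toy)
    phi = rng.dirichlet(np.ones(V), size=K)           # K x V topic-word probabilities
    counts = rng.multinomial(200, phi.mean(axis=0))   # toy word counts for one document

    def softmax(x):
        z = np.exp(x - x.max())
        return z / z.sum()

    def neg_log_post(eta):
        """Negative unnormalised log posterior of eta under a N(0, I) prior."""
        theta = softmax(eta)              # topic proportions
        word_probs = theta @ phi          # document's mixture over the vocabulary
        return -(counts @ np.log(word_probs)) + 0.5 * eta @ eta

    # 1) MAP estimate of the latent Gaussian vector.
    eta_map = minimize(neg_log_post, np.zeros(K), method="BFGS").x

    # 2) Laplace approximation: Gaussian centred at the MAP, covariance = inverse Hessian
    #    of the negative log posterior (central-difference Hessian for brevity).
    def numerical_hessian(f, x, eps=1e-4):
        n = x.size
        H = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                ei, ej = np.eye(n)[i] * eps, np.eye(n)[j] * eps
                H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                           - f(x - ei + ej) + f(x - ei - ej)) / (4 * eps ** 2)
        return H

    cov_laplace = np.linalg.inv(numerical_hessian(neg_log_post, eta_map))
    print("MAP topic proportions:", np.round(softmax(eta_map), 3))
    print("Laplace posterior variances:", np.round(np.diag(cov_laplace), 4))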
Similar papers
Spatial Latent Gaussian Models: Application to House Prices Data in Tehran City
Latent Gaussian models are flexible models used in many statistical applications. When the posterior marginals or full conditional distributions required for hierarchical Bayesian inference in these models are not available in closed form, Markov chain Monte Carlo methods are employed. The dependence among components of the latent field usually causes an increase in computational time and divergenc...
Parameter Estimation in Spatial Generalized Linear Mixed Models with Skew Gaussian Random Effects using Laplace Approximation
Spatial generalized linear mixed models are commonly used for modelling non-Gaussian discrete spatial responses. We present an algorithm for parameter estimation in these models using a Laplace approximation of the likelihood function. In these models, the spatial correlation structure of the data is captured by random effects or latent variables. In most spatial analyses, it is assumed that rando...
Kernel Topic Models
Latent Dirichlet Allocation models discrete data as a mixture of discrete distributions, using Dirichlet beliefs over the mixture weights. We study a variation of this concept, in which the documents’ mixture weight beliefs are replaced with squashed Gaussian distributions. This allows documents to be associated with elements of a Hilbert space, admitting kernel topic models (KTM), modelling te...
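As a toy illustration of the squashed-Gaussian construction described in this entry (a sketch under my own assumptions, not the KTM authors' code): each topic is given a Gaussian-process function over document covariates, and the softmax of the per-document function values yields that document's mixture weights. The covariates x, the RBF kernel, and the lengthscale are illustrative choices.

    # Toy sketch: softmax-squashed Gaussian-process values as per-document topic weights.
    import numpy as np

    rng = np.random.default_rng(1)
    D, K = 8, 3                               # documents, topics (toy sizes)
    x = np.linspace(0.0, 1.0, D)[:, None]     # 1-D document covariates (e.g. timestamps)

    def rbf_kernel(a, b, lengthscale=0.2):
        return np.exp(-0.5 * (a - b.T) ** 2 / lengthscale ** 2)

    K_xx = rbf_kernel(x, x) + 1e-8 * np.eye(D)   # GP covariance over documents (jittered)
    L = np.linalg.cholesky(K_xx)
    f = L @ rng.standard_normal((D, K))          # one GP sample per topic, evaluated at x

    theta = np.exp(f - f.max(axis=1, keepdims=True))
    theta /= theta.sum(axis=1, keepdims=True)    # squashed (softmax) mixture weights
    print(np.round(theta, 3))                    # rows: documents, columns: topics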
Tensor Decomposition for Topic Models: An Overview and Implementation
The goal of a topic model is to characterize observed data in terms of a much smaller set of unobserved topics. Topic models have proven especially popular for information retrieval. Latent Dirichlet Allocation (LDA) is the most popular generative model used for topic modeling. Learning the optimal parameters of the LDA model efficiently, however, is an open question. As [2] point out, the trad...
Unsupervised Topic Modeling for Short Texts Using Distributed Representations of Words
We present an unsupervised topic model for short texts that performs soft clustering over distributed representations of words. We model the low-dimensional semantic vector space represented by the dense distributed representations of words using Gaussian mixture models (GMMs) whose components capture the notion of latent topics. While conventional topic modeling schemes such as probabilistic l...
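A rough sketch of this style of model, assuming pre-trained word embeddings are available (random vectors stand in for them below, and the five-component choice is arbitrary): fit a Gaussian mixture over the embedding space with scikit-learn and treat each component as a latent topic, ranking words by their component responsibilities.

    # Toy sketch: Gaussian mixture components over word embeddings read as latent topics.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(2)
    vocab = [f"word_{i}" for i in range(300)]
    embeddings = rng.standard_normal((len(vocab), 50))   # placeholder for word2vec/GloVe vectors

    gmm = GaussianMixture(n_components=5, covariance_type="diag", random_state=0)
    gmm.fit(embeddings)

    resp = gmm.predict_proba(embeddings)                 # (n_words, n_topics) responsibilities
    for k in range(gmm.n_components):
        top = np.argsort(resp[:, k])[::-1][:5]           # most "topical" words per component
        print(f"topic {k}:", [vocab[i] for i in top])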
Publication date: 2014